    Probabilistic Deterministic Finite Automata and Recurrent Networks, Revisited

    Reservoir computers (RCs) and recurrent neural networks (RNNs) can mimic any finite-state automaton in theory, and some researchers have demonstrated that this can hold in practice. We test the capability of generalized linear models, RCs, and Long Short-Term Memory (LSTM) RNN architectures to predict the stochastic processes generated by a large suite of probabilistic deterministic finite-state automata (PDFAs). PDFAs provide an excellent performance benchmark in that they can be systematically enumerated, the randomness and correlation structure of their generated processes are exactly known, and their optimal memory-limited predictors are easily computed. Unsurprisingly, LSTMs outperform RCs, which outperform generalized linear models. Surprisingly, each of these methods can fall short of the maximal predictive accuracy by as much as 50% after training and, when optimized, tends to fall short of the maximal predictive accuracy by ~5%, even though previously available methods achieve maximal predictive accuracy with orders-of-magnitude less data. Thus, despite the representational universality of RCs and RNNs, using them can engender a surprising predictive gap for simple stimuli. One concludes that there is an important and underappreciated role for methods that infer "causal states" or "predictive state representations".
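
    As a concrete illustration of the benchmark setup described above, the following sketch (a toy example under our own assumptions, not the authors' code) draws a sample sequence from one simple PDFA, the two-state "golden mean" process, whose optimal memory-limited predictive accuracy is exactly 2/3, and compares a predictor estimated from the data against that known optimum.

        import numpy as np

        # Two-state PDFA ("golden mean" process): state A emits 0 or 1 with equal
        # probability and emitting 1 moves to state B; state B must emit 0 and
        # return to A.  The word "11" never occurs and the optimal next-symbol
        # accuracy is 2/3.
        rng = np.random.default_rng(0)

        def sample_golden_mean(n):
            state, out = "A", []
            for _ in range(n):
                if state == "A":
                    s = int(rng.integers(0, 2))
                    state = "B" if s == 1 else "A"
                else:
                    s, state = 0, "A"
                out.append(s)
            return np.array(out)

        x = sample_golden_mean(100_000)

        # Optimal memory-limited rule: always predict 0 (after a 1 the next symbol
        # is certainly 0; after a 0 both symbols are equally likely).
        opt_acc = np.mean(x[1:] == 0)

        # Order-1 predictor estimated from the data itself.
        counts = np.zeros((2, 2))
        for a, b in zip(x[:-1], x[1:]):
            counts[a, b] += 1
        learned_acc = np.mean(counts.argmax(axis=1)[x[:-1]] == x[1:])

        print(f"optimal accuracy ~ {opt_acc:.3f}, learned accuracy ~ {learned_acc:.3f}")

    For this toy process a first-order predictor already attains the optimum; the abstract's point is that for the richer PDFAs in the enumerated suite, trained RCs and RNNs can remain well below it.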

    Fraudulent white noise: Flat power spectra belie arbitrarily complex processes

    Power spectral densities are a common, convenient, and powerful way to analyze signals, so much so that they are now broadly deployed across the sciences and engineering - from quantum physics to cosmology and from crystallography to neuroscience to speech recognition. The features they reveal not only identify prominent signal frequencies but also hint at the mechanisms that generate correlation and lead to resonance. Despite their near-century-long run of successes in signal analysis, here we show that flat power spectra can be generated by highly complex processes, effectively hiding all inherent structure in complex signals. Historically, this circumstance has been widely misinterpreted, being taken as the renowned signature of "structureless" white noise - the benchmark of randomness. We argue, in contrast, that to the extent that most real-world complex systems exhibit correlations beyond pairwise statistics, their structures evade power spectra and other pairwise statistical measures. As concrete physical examples, we demonstrate that fraudulent white noise hides the predictable structure of both entangled quantum systems and chaotic crystals. To make these words of warning operational, we present constructive results that explore how this situation comes about and the high toll it takes in understanding complex mechanisms. First, we give the closed-form solution for the power spectrum of a very broad class of structurally complex signal generators. Second, we demonstrate the close relationship between the eigenspectra of evolution operators and power spectra. Third, we characterize the minimal generative structure implied by any power spectrum. Fourth, we show how to construct arbitrarily complex processes with flat power spectra. Finally, leveraging this diagnosis of the problem, we point the way to developing more incisive tools for discovering structure in complex signals.
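
    To make the warning concrete, the sketch below (a textbook illustration under our own choices, not one of the paper's constructions) estimates the power spectrum of the fully chaotic logistic map: the centered signal is delta-correlated, so its spectrum is flat and passes the usual white-noise test, even though every sample is an exact deterministic function of the one before it.

        import numpy as np
        from scipy.signal import periodogram

        # Fully chaotic logistic map: x_{n+1} = 4 x_n (1 - x_n).  The centered
        # output is delta-correlated, so its power spectrum is flat ("white"),
        # yet the signal is completely determined by one step of memory.
        n = 2**16
        x = np.empty(n)
        x[0] = 0.37519                      # arbitrary seed in (0, 1)
        for i in range(n - 1):
            x[i + 1] = 4.0 * x[i] * (1.0 - x[i])

        f, pxx = periodogram(x - x.mean())

        # The smoothed spectrum is nearly constant across frequency ...
        smooth = np.convolve(pxx, np.ones(256) / 256, mode="valid")
        print("relative variation of smoothed spectrum:", smooth.std() / smooth.mean())

        # ... yet the "noise" is perfectly predictable from the previous sample.
        err = np.abs(4.0 * x[:-1] * (1.0 - x[:-1]) - x[1:]).max()
        print("one-step prediction error:", err)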

    Thermodynamically-efficient local computation and the inefficiency of quantum memory compression

    Modularity dissipation identifies how locally implemented computation entails costs beyond those required by Landauer's bound on thermodynamic computing. We establish a general theorem for efficient local computation, giving the necessary and sufficient conditions for a local operation to have zero modularity cost. Applied to thermodynamically generating stochastic processes, it confirms a conjecture that classical generators are efficient if and only if they satisfy retrodiction, which places minimum-memory requirements on the generator. This extends immediately to quantum computation: any quantum simulator that employs quantum memory compression cannot be thermodynamically efficient.
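
    For reference, the baseline invoked above is the (generalized) Landauer bound, a standard result rather than something specific to this paper: the average heat a computation must dissipate is bounded below by the reduction in the Shannon entropy of the processed variable,

        \langle Q \rangle \;\ge\; k_B T \ln 2 \,\bigl[\, H(X_{\mathrm{in}}) - H(X_{\mathrm{out}}) \,\bigr],

    with entropies in bits, k_B Boltzmann's constant, and T the bath temperature. The modularity dissipation studied here is the additional cost, beyond this bound, incurred when the operation is implemented locally.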

    Inferring hidden Markov models from noisy time sequences: a method to alleviate degeneracy in molecular dynamics

    We present a new method for inferring hidden Markov models from noisy time sequences without the need to assume a model architecture, thus allowing for the detection of degenerate states. The method is based on the statistical prediction techniques developed by Crutchfield et al. and generates so-called causal-state models, which are equivalent to hidden Markov models. It is applicable to any continuous data that cluster around discrete values and exhibit multiple transitions between those values, such as tethered particle motion data or Fluorescence Resonance Energy Transfer (FRET) spectra. The algorithms developed have been shown to perform well on simulated data, demonstrating the ability to recover the model used to generate the data under high-noise, sparse-data conditions and to infer the existence of degenerate states. They have also been applied to new experimental FRET data of Holliday junction dynamics, extracting the expected two-state model and providing values for the transition rates in good agreement with previous results and with results obtained using existing maximum-likelihood-based methods.
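
    The sketch below illustrates the basic pipeline the abstract describes, on assumed synthetic data and with plain thresholding and transition counting rather than the authors' causal-state reconstruction (which is what additionally allows degenerate states to be detected): observations clustered around discrete levels are discretized, and transition probabilities are then estimated from symbol counts.

        import numpy as np

        # Synthetic two-state illustration: a Markov chain with noisy emissions
        # around two FRET-like levels, discretized by a threshold, with the
        # switching probabilities then estimated by counting transitions.
        rng = np.random.default_rng(1)

        p_switch = np.array([0.05, 0.10])     # probability of leaving each state
        levels, noise = np.array([0.3, 0.7]), 0.05

        states = [0]
        for _ in range(20_000):
            s = states[-1]
            states.append(1 - s if rng.random() < p_switch[s] else s)
        states = np.array(states)
        signal = levels[states] + noise * rng.standard_normal(states.size)

        # Discretize halfway between the two apparent levels, then count
        # symbol-to-symbol transitions to estimate the transition matrix.
        symbols = (signal > levels.mean()).astype(int)
        counts = np.zeros((2, 2))
        for a, b in zip(symbols[:-1], symbols[1:]):
            counts[a, b] += 1
        trans = counts / counts.sum(axis=1, keepdims=True)
        print("estimated switching probabilities:", trans[0, 1], trans[1, 0])

    With the assumed noise level this recovers switching probabilities close to the 0.05 and 0.10 used to generate the data; real data call for the model-free treatment and degeneracy detection developed in the paper.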